8 research outputs found

    Slaughter weight rather than sex affects carcass cuts and tissue composition of Bisaro pigs

    Carcass cuts and tissue composition were assessed in Bisaro pigs (n = 64) of two sexes (31 gilts and 33 entire males) reared to three target slaughter body weights (BW; means of 17, 32 and 79 kg). Dressing percentage and backfat thickness increased, whereas carcass shrinkage decreased, with increasing BW. Slaughter weight affected most carcass cut proportions, except those of the shoulder and thoracic regions. Bone proportion decreased linearly with increasing slaughter BW, while the intermuscular and subcutaneous adipose tissue depots increased concomitantly. In the loin primal, increasing slaughter weight raised the subcutaneous adipose tissue proportion at the expense of the intramuscular and intermuscular adipose tissues. Sex minimally affected carcass composition: only belly weight and the subcutaneous adipose tissue proportion were greater in gilts than in entire males. Light pigs, regardless of sex, are recommended to balance the trade-offs between carcass cuts and their non-edible tissue composition.

    Work included in the Portuguese PRODER research project BISOPORC – Pork extensive production of Bísara breed, in two alternative systems: fattening on concentrate vs chestnut, Project PRODER SI I&DT Medida 4.1 "Cooperação para a Inovação" (Cooperation for Innovation). The authors are grateful to the Laboratory of Carcass and Meat Quality of the Agriculture School of the Polytechnic Institute of Bragança 'Cantinho do Alfredo'. The authors are members of the MARCARNE network, funded by CYTED (ref. 116RT0503).

    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer volumes of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.

    A 3D Geodatabase for Urban Underground Infrastructures: Implementation and Application to Groundwater Management in Milan Metropolitan Area

    The recent rapid increase in urbanization has led to the inclusion of underground spaces in urban planning policies. Among the main subsurface resources, a strong interaction between underground infrastructures and groundwater has emerged in many urban areas over the last few decades. Listing the underground infrastructures is therefore necessary to structure an urban conceptual model for groundwater management needs. Starting from municipal cartography (Open Data), which makes the procedure replicable, a GIS methodology was proposed to gather all the underground infrastructures into an updatable 3D geodatabase (GDB) for the metropolitan city of Milan (Northern Italy). The underground volumes occupied by three categories of infrastructure were included in the GDB: (a) private car parks, (b) public car parks and (c) subway lines and stations. The application of the GDB allowed the volumes lying below the groundwater table to be estimated for four periods, identified as groundwater minima or maxima from the piezometric trend reconstructions. Owing to groundwater rise or local hydrogeological conditions, the shallowest, non-waterproofed underground infrastructures were flooded in some of the periods considered. This was evaluated in a specific pilot area and qualitatively confirmed by reviews of the local press and photographic documentation. The methodology proved efficient for urban planning, particularly for defining urban conceptual models and groundwater management plans.
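
    The volume estimation at the core of the GDB application reduces to intersecting each infrastructure volume with a piezometric surface. A back-of-the-envelope sketch in Python, treating each structure as a simple prism; the class and attribute names are illustrative, not the paper's actual GDB schema:

    ```python
    # Hypothetical, simplified schema: each structure is a prism defined by
    # a plan-view footprint and top/bottom elevations (m above sea level).
    from dataclasses import dataclass

    @dataclass
    class UndergroundStructure:
        name: str
        footprint_m2: float   # plan-view area
        top_m_asl: float      # elevation of the structure's top
        bottom_m_asl: float   # elevation of the structure's bottom (invert)

    def submerged_volume_m3(s: UndergroundStructure, water_table_m_asl: float) -> float:
        """Volume of the structure lying below the water table."""
        wet_height = min(s.top_m_asl, water_table_m_asl) - s.bottom_m_asl
        return s.footprint_m2 * max(0.0, wet_height)

    park = UndergroundStructure("car park P1", 2500.0, 118.0, 108.0)
    # Compare a groundwater minimum and a maximum, as in the four periods analysed.
    for water_table in (106.0, 113.0):
        print(water_table, submerged_volume_m3(park, water_table))
    ```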

    Multivariate Time Series Clustering of Groundwater Quality Data to Develop Data-Driven Monitoring Strategies in a Historically Contaminated Urban Area

    As groundwater quality monitoring networks have expanded over the last decades, substantial time series are now available. A scientific effort is therefore needed to explore innovative techniques for exploiting groundwater quality time series. In this work, time series exploratory analysis and time series cluster analysis are applied to groundwater contamination data with the aim of developing data-driven monitoring strategies. The study area is an urban area characterized by several superimposed historical contamination sources and a complex hydrogeological setting. A multivariate time series cluster analysis was performed on PCE and TCE concentration data over a 10-year time span. The time series clustering was based on the Dynamic Time Warping (DTW) method. The clustering identified 3 clusters associated with diffuse background contamination and 7 clusters associated with local hotspots, each characterized by a specific time profile. Similarly, a univariate time series cluster analysis was applied to Cr(VI) data, identifying 3 background clusters and 7 hotspots, including 4 singletons. The clustering outputs provided the basis for the implementation of data-driven monitoring strategies and early warning systems. For the clusters associated with diffuse background contamination and those with constant trends, trigger levels were calculated as the 95th percentile, constituting future threshold values for early warnings. For the clusters with pluriannual trends, whether oscillatory or monotonic, specific monitoring strategies were proposed based on the trends' directions. The results show that the spatio-temporal overview of data variability obtained from the time series cluster analysis helped to extract relevant information from the data while filtering out measurement noise and uncertainty, supporting the implementation of more efficient groundwater quality monitoring.
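
    As an illustration of the approach, a minimal DTW-based clustering sketch using the open-source tslearn library on synthetic data; the well count, series length and cluster number are placeholders, not the study's values:

    ```python
    import numpy as np
    from tslearn.clustering import TimeSeriesKMeans
    from tslearn.preprocessing import TimeSeriesScalerMeanVariance

    rng = np.random.default_rng(0)

    # Synthetic stand-in for the monitoring data: 30 wells, 40 sampling
    # rounds, 2 variables per well (e.g. PCE and TCE concentrations).
    X = rng.lognormal(mean=0.0, sigma=0.5, size=(30, 40, 2))
    X[:10] += np.linspace(0.0, 3.0, 40)[None, :, None]  # mimic a rising hotspot

    # z-normalise each series so clustering reflects shape, not magnitude.
    X = TimeSeriesScalerMeanVariance().fit_transform(X)

    # k-means with Dynamic Time Warping as the distance measure.
    model = TimeSeriesKMeans(n_clusters=4, metric="dtw", random_state=0)
    labels = model.fit_predict(X)

    # Trigger level for one cluster as its 95th percentile (the paper uses
    # raw concentrations; normalised values are used here for brevity).
    trigger = np.percentile(X[labels == 0], 95)
    print(labels, trigger)
    ```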

    Quantifying Groundwater Infiltrations into Subway Lines and Underground Car Parks Using MODFLOW-USG

    Urbanization is a worldwide process that has recently culminated in wider use of the subsurface, leading to significant interaction between groundwater and underground infrastructures. This can result in infiltration, corrosion, and stability issues for the subsurface elements. Numerical models are the most widely applied tools to manage these situations. Using MODFLOW-USG and combining the Wall (HFB) and DRN packages, this study aimed at simulating underground infrastructures (i.e., subway lines and public car parks) and quantifying the infiltrations into them. This issue has been investigated in depth for water inrush during tunnel construction, but problems also occur during tunnel operation. The methodology involved developing a steady-state groundwater flow model, calibrated against a maximum groundwater condition, for the western portion of the city of Milan (Northern Italy, Lombardy Region). The overall findings showed that the most impacted areas are sections of subway tunnels already identified as submerged. This spatial coherence with historical information acts both as a validation of the model and as a step forward, since the infiltrations resulting from interaction with the water table were quantified. The methodology improved the urban conceptual model and could support stakeholders in adopting proper measures to manage the interactions between groundwater and underground infrastructures.
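
    A minimal FloPy sketch of the Wall (HFB) plus DRN combination described above; for brevity it uses a regular MODFLOW-2005 grid rather than MODFLOW-USG's unstructured grid, and all geometry and parameter values are placeholders:

    ```python
    import numpy as np
    import flopy

    m = flopy.modflow.Modflow(modelname="demo", exe_name="mf2005")
    flopy.modflow.ModflowDis(m, nlay=1, nrow=50, ncol=50, delr=100.0,
                             delc=100.0, top=120.0, botm=60.0)

    # Fixed heads on the west/east edges so the steady-state problem is posed.
    ibound = np.ones((1, 50, 50), dtype=int)
    ibound[:, :, 0] = ibound[:, :, -1] = -1
    flopy.modflow.ModflowBas(m, ibound=ibound, strt=110.0)
    flopy.modflow.ModflowLpf(m, hk=50.0)

    # Wall (HFB): a low-permeability barrier between two adjacent cells,
    # standing in for a waterproofed diaphragm wall
    # [layer, row1, col1, row2, col2, hydraulic characteristic].
    flopy.modflow.ModflowHfb(m, hfb_data=[[0, 24, 24, 24, 25, 1e-7]])

    # DRN: a drain cell at the tunnel invert elevation; its simulated
    # outflow is the infiltration estimate [layer, row, col, elev, cond].
    flopy.modflow.ModflowDrn(m, stress_period_data={0: [[0, 24, 25, 105.0, 10.0]]})

    flopy.modflow.ModflowPcg(m)
    flopy.modflow.ModflowOc(m)
    m.write_input()
    # After running the model, drain (infiltration) flows can be read from
    # the cell-by-cell budget file via flopy.utils.CellBudgetFile.
    ```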

    System Performance and Cost Modelling in LHC computing

    The increase in the scale of LHC computing expected for Run 3, and even more so for Run 4 (HL-LHC), over the next ten years will require radical changes to the computing models and data processing of the LHC experiments. Translating the requirements of the physics programmes into computing resource needs is a complicated process subject to significant uncertainties. For this reason, WLCG has established a working group to develop methodologies and tools intended to characterise the LHC workloads, better understand their interaction with the computing infrastructure, calculate their cost in terms of resources and expenditure, and assist experiments, sites and the WLCG project in evaluating their future choices. The working group started in November 2017 and has about 30 active participants representing experiments and sites. In this contribution we present the activities, the results achieved and the future directions.
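
    The first step of such a cost model is turning physics requirements into resource estimates. A deliberately simplified sketch of that translation; the linear workload model and all numbers are placeholders, not WLCG figures:

    ```python
    SECONDS_PER_YEAR = 365 * 24 * 3600

    def cpu_cores_needed(events_per_year: float,
                         cpu_sec_per_event: float,
                         cpu_efficiency: float = 0.75) -> float:
        """Average core count needed to keep up with a year's data,
        assuming cost scales linearly with the number of events."""
        wall_sec = events_per_year * cpu_sec_per_event / cpu_efficiency
        return wall_sec / SECONDS_PER_YEAR

    # e.g. 1e10 events/year at 100 CPU-seconds per event
    print(f"{cpu_cores_needed(1e10, 100.0):,.0f} cores")
    ```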

    New developments in cost modeling for the LHC computing

    The increase in the scale of LHC computing during Run 3 and Run 4 (HL-LHC) will require radical changes to the computing models and data processing of the LHC experiments. The working group established by WLCG and the HEP Software Foundation to investigate all aspects of the cost of computing, and how to optimise them, has continued producing results and improving our understanding of this process. In particular, the experiments have developed more sophisticated ways to calculate their resource needs, and we have a much more detailed process to calculate infrastructure costs. This includes studies on the impact of HPC and GPU-based resources on meeting the computing demands. We have also developed and refined tools to quantitatively study the performance of experiment workloads, and we are actively collaborating with other activities related to data access, benchmarking and technology cost evolution. In this contribution we present our recent developments and results and outline the directions of future work.

    Plastic debris in lakes and reservoirs

    Plastic debris is thought to be widespread in freshwater ecosystems globally. However, a lack of comprehensive and comparable data makes a rigorous assessment of its distribution challenging. Here we present a standardized cross-national survey that assesses the abundance and type of plastic debris (>250 μm) in freshwater ecosystems. We sample surface waters of 38 lakes and reservoirs, distributed across gradients of geographical position and limnological attributes, with the aim of identifying factors associated with an increased observation of plastics. We find plastic debris in all studied lakes and reservoirs, suggesting that these ecosystems play a key role in the plastic-pollution cycle. Our results indicate that two types of lakes are particularly vulnerable to plastic contamination: lakes and reservoirs in densely populated and urbanized areas, and large lakes and reservoirs with elevated deposition areas, long water-retention times and high levels of anthropogenic influence. Plastic concentrations vary widely among lakes; in the most polluted, concentrations reach or even exceed those reported in the subtropical oceanic gyres, marine areas that collect large amounts of debris. Our findings highlight the importance of including lakes and reservoirs when addressing plastic pollution, both for pollution management and for the continued provision of lake ecosystem services.
